
    Kolmogorov-Sinai entropy in field line diffusion by anisotropic magnetic turbulence

    The Kolmogorov-Sinai (KS) entropy in turbulent diffusion of magnetic field lines is analyzed on the basis of a numerical simulation model and theoretical investigations. In the parameter range of strongly anisotropic magnetic turbulence, the KS entropy is shown to deviate considerably from the earlier predicted scaling relations [Rev. Mod. Phys. 64, 961 (1992)]. In particular, a slowed-down, logarithmic behavior versus the so-called Kubo number $R \gg 1$ ($R = (\delta B / B_0)(\xi_\parallel / \xi_\perp)$, where $\delta B / B_0$ is the ratio of the rms magnetic fluctuation field to the magnetic field strength, and $\xi_\perp$ and $\xi_\parallel$ are the correlation lengths in the respective directions) is found instead of a power-law dependence. These discrepancies are explained from general principles of Hamiltonian dynamics. We discuss the implication of Hamiltonian properties in governing the paradigmatic "percolation" transport, characterized by $R \to \infty$, associating it with the concept of pseudochaos (random non-chaotic dynamics with zero Lyapunov exponents). Applications of this study pertain to both fusion and astrophysical plasmas and, by mathematical analogy, to problems outside plasma physics. This research article is dedicated to the memory of Professor George M. Zaslavsky. Comment: 15 pages, 2 figures. Accepted for publication in Plasma Physics and Controlled Fusion.
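    As a rough illustration of the quantities involved, the sketch below computes the Kubo number from the definition quoted above and contrasts a hypothetical power-law with a logarithmic dependence of the KS entropy; all numerical values (field ratio, correlation lengths, exponent) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def kubo_number(delta_B, B0, xi_par, xi_perp):
    """Kubo number R = (delta_B / B0) * (xi_parallel / xi_perp)."""
    return (delta_B / B0) * (xi_par / xi_perp)

# Illustrative (assumed) parameters for a strongly anisotropic case, giving R >> 1.
R = kubo_number(delta_B=0.5, B0=1.0, xi_par=200.0, xi_perp=1.0)

# Two candidate dependences of the KS entropy on R (constants are placeholders):
# a power law, as in the earlier scaling predictions, versus the slower,
# logarithmic behaviour reported here for R >> 1.
h_power_law = R ** 0.5     # hypothetical exponent of 1/2, for comparison only
h_logarithmic = np.log(R)

print(f"R = {R:.0f}: power-law estimate ~ {h_power_law:.1f}, logarithmic ~ {h_logarithmic:.1f}")
```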

    Transverse spin densities of octet baryons using Lattice QCD

    We present results from the QCDSF/UKQCD collaboration for the transverse spin densities of octet baryons, obtained from simulations using Nf = 2+1 flavours of O(a)-improved Wilson fermions. These densities are revealed through an analysis of the electromagnetic and tensor form factors of the octet baryons at two different lattice spacings, with pion masses as low as 220 MeV. We find SU(3) flavour-breaking effects in the form factors and use these to extrapolate to the physical point. Constructing combinations of the Fourier-transformed form factors reveals non-trivial spin densities in the transverse plane, with similar deformations across the baryon octet.
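    The transverse densities referred to above are obtained by Fourier-transforming form factors into impact-parameter space. Purely as a schematic illustration of that step, the sketch below transforms an assumed dipole form factor; the functional form and all parameter values are hypothetical and are not the quantities extracted from the lattice data.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import j0

def dipole_form_factor(Q2, m2=0.71):
    """Toy dipole form factor F(Q^2) = 1 / (1 + Q^2/m2)^2, with Q^2 and m2 in GeV^2."""
    return 1.0 / (1.0 + Q2 / m2) ** 2

def transverse_density(b, form_factor, Q_max=30.0):
    """rho(b) = (1/2pi) int_0^inf dQ Q J0(Q b) F(Q^2):
    the 2-D Fourier transform of a form factor to impact-parameter space."""
    integrand = lambda Q: Q * j0(Q * b) * form_factor(Q ** 2) / (2.0 * np.pi)
    value, _ = quad(integrand, 0.0, Q_max, limit=200)
    return value

# Density profile in the transverse plane (b in GeV^-1; 1 GeV^-1 ~ 0.197 fm).
for b in (0.5, 1.0, 2.0, 4.0):
    print(f"b = {b:.1f} GeV^-1: rho = {transverse_density(b, dipole_form_factor):.4f}")
```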

    The prognosis of allocentric and egocentric neglect: evidence from clinical scans

    We contrasted the neuroanatomical substrates of sub-acute and chronic visuospatial deficits associated with different aspects of unilateral neglect, using computed tomography (CT) scans acquired as part of routine clinical diagnosis. Voxel-wise statistical analyses were conducted on a group of 160 stroke patients scanned at a sub-acute stage. Lesion-deficit relationships were assessed across the whole brain, separately for grey and white matter. We assessed lesions that were associated with behavioural performance (i) at a sub-acute stage (within 3 months of the stroke) and (ii) at a chronic stage (more than 9 months post-stroke). Allocentric and egocentric neglect symptoms at the sub-acute stage were associated with lesions to dissociated regions within the frontal lobe, amongst other regions. However, the frontal lesions were not associated with neglect at the chronic stage. On the other hand, lesions in the angular gyrus were associated with persistent allocentric neglect. In contrast, lesions within the superior temporal gyrus extending into the supramarginal gyrus, as well as lesions within the basal ganglia and insula, were associated with persistent egocentric neglect. Damage within the temporo-parietal junction was associated with both types of neglect at the sub-acute stage and 9 months later. Furthermore, white matter disconnections resulting from damage along the superior longitudinal fasciculus were associated with both types of neglect and critically related to both sub-acute and chronic deficits. Finally, there was a significant difference in lesion volume between patients who recovered from neglect and patients with chronic deficits. The findings presented provide evidence that (i) lesion location and lesion size can be used to successfully predict the outcome of neglect based on clinical CT scans, (ii) lesion location alone can serve as a critical predictor for persistent neglect symptoms, (iii) widespread lesions are associated with neglect symptoms at the sub-acute stage but only some of these are critical for predicting whether neglect will become a chronic disorder, and (iv) the severity of behavioural symptoms can be a useful predictor of recovery in the absence of neuroimaging findings on clinical scans. We discuss the implications for understanding the symptoms of the neglect syndrome, the recovery of function, and the use of clinical scans to predict outcome.
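    The voxel-wise lesion-deficit analysis described above can be read as a mass-univariate comparison: at each voxel, the behavioural scores of patients whose lesion covers that voxel are compared with the scores of patients whose lesion does not. The sketch below illustrates that idea with entirely synthetic data and an assumed two-sample t-test; the study's actual statistical model and thresholds may differ.

```python
import numpy as np
from scipy import stats

def voxelwise_lesion_mapping(lesion_maps, scores, min_patients=5):
    """Voxel-wise lesion-symptom mapping sketch.

    lesion_maps : (n_patients, n_voxels) binary array, 1 = voxel lesioned
    scores      : (n_patients,) behavioural scores (e.g. neglect severity)
    Returns one t-value per voxel (NaN where either group is too small)."""
    n_patients, n_voxels = lesion_maps.shape
    t_map = np.full(n_voxels, np.nan)
    for v in range(n_voxels):
        lesioned = lesion_maps[:, v] == 1
        if lesioned.sum() >= min_patients and (~lesioned).sum() >= min_patients:
            t_map[v], _ = stats.ttest_ind(scores[lesioned], scores[~lesioned])
    return t_map

# Tiny synthetic example: 160 patients, 1000 voxels, random lesions and scores.
rng = np.random.default_rng(0)
lesions = rng.integers(0, 2, size=(160, 1000))
severity = rng.normal(size=160)
t_map = voxelwise_lesion_mapping(lesions, severity)
```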

    Norwich COVID-19 testing initiative pilot: evaluating the feasibility of asymptomatic testing on a university campus

    Background: There is a high prevalence of COVID-19 in university-age students, who are returning to campuses. There is little evidence regarding the feasibility of universal, asymptomatic testing to help control outbreaks in this population. This study aimed to pilot mass COVID-19 testing on a university research park, to assess the feasibility and acceptability of scaling up testing to all staff and students. Methods: This was a cross-sectional feasibility study on a university research park in the East of England. All 5625 staff and students were eligible to participate. All participants were offered four PCR swabs, which they self-administered over two weeks. Outcome measures included uptake, drop-out rate, positivity rates, participant acceptability measures, laboratory processing measures, and data collection and management measures. Results: 798 (76%) of the 1053 who registered provided at least one swab; 687 (86%) provided all four; 792 (99%) of the 798 who submitted at least one swab had all negative results, and 6 participants had one inconclusive result. There were no positive results. 458 (57%) of the 798 participants responded to a post-testing survey, giving a mean acceptability score of 4.51/5, with five being the most positive. Conclusions: Repeated self-testing for COVID-19 using PCR is feasible and acceptable to a university population.
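    The participation figures quoted in the Results can be reproduced from the reported counts with simple ratios; a quick arithmetic check:

```python
# Counts quoted in the abstract.
registered = 1053
at_least_one_swab = 798
all_four_swabs = 687
all_negative = 792
survey_responses = 458

print(f"Provided at least one swab: {at_least_one_swab / registered:.0%}")       # ~76%
print(f"Completed all four swabs:   {all_four_swabs / at_least_one_swab:.0%}")   # ~86%
print(f"All results negative:       {all_negative / at_least_one_swab:.0%}")     # ~99%
print(f"Post-testing survey return: {survey_responses / at_least_one_swab:.0%}") # ~57%
```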

    Designing for emergence and innovation: Redesigning design

    We reveal the surprising and counterintuitive truth that the design process, in and of itself, is not always at the forefront of innovation. Design is a necessary but not a sufficient condition for the success of new products and services. We intuitively sense a connection between innovative design and emergence. We examine the nature of design, emergence and innovation to understand their interrelationships and interdependencies. We propose that design must harness the process of emergence; it is only through the bottom-up and massively iterative unfolding of emergence that new and improved products and services are successfully refined, introduced and diffused into the marketplace. We develop the relationships among design, emergence and innovation, and explore what designers can learn from nature about emergence and evolution that will impact the design process. We examine the roles that design and emergence play in innovation, and how innovative organizations can incorporate emergence into their design process. We demarcate the boundary between invention and innovation, and articulate the similarities and differences between design and emergence. We then develop the following three hypotheses. Hypothesis 1: "An innovative design is an emergent design." Hypothesis 2: "A homeostatic relationship between design and emergence is a required condition for innovation." Hypothesis 3: "Since design is a cultural activity and culture is an emergent phenomenon, it follows that design leading to innovation is also an emergent phenomenon." We provide a number of examples of how design and emergence have worked together and led to innovation, including the tool making of early man; the evolutionary chain of the six languages (speech, writing, math, science, computing and the Internet); the Gutenberg printing press; and the techniques of collaborative filtering associated with the Internet. We close by describing the relationship between human and naturally "designed" systems and the notion that a key element of a design is its purpose, as is the case with a living organism.

    A Search for sub-km KBOs with the Method of Serendipitous Stellar Occultations

    The results of a search for sub-km Kuiper Belt Objects (KBOs) with the method of serendipitous stellar occultations are reported. Photometric time series were obtained with the 1.8 m telescope at the Dominion Astrophysical Observatory (DAO) in Victoria, BC, and were analyzed for the presence of occultation events. Observations were performed at 40 Hz and included a total of 5.0 star-hours for target stars in the ecliptic open cluster M35 (beta = 0.9 deg), and 2.1 star-hours for control stars in the off-ecliptic open cluster M34 (beta = 25.7 deg). To evaluate the recovery fraction of the analysis method, and thereby determine the limiting detectable size, artificial occultation events were added to simulated time series (with 1/f, scintillation-like power spectra) and to the real data. No viable candidate occultation events were detected. This limits the cumulative surface density of KBOs to 3.5e10 deg^{-2} (95% confidence) for KBOs brighter than m_R = 35.3 (larger than ~860 m in diameter, assuming a geometric albedo of 0.04 and a distance of 40 AU). An evaluation of TNO occultations reported in the literature suggests that they are unlikely to be genuine, and an overall 95%-confidence upper limit on the surface density of 2.8e9 deg^{-2} is obtained for KBOs brighter than m_R = 35 (larger than ~1 km in diameter, assuming a geometric albedo of 0.04 and a distance of 40 AU) when all existing surveys are combined. Comment: Accepted for publication in A
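    The quoted surface-density limit follows from a standard argument: with zero detections, the 95% upper limit on the expected number of events is about 3.0, and dividing by the effective solid angle swept past the monitored stars gives the limit per square degree. The sketch below reproduces that logic with illustrative (assumed) values for the shadow velocity and the effective occultation cross-section; they are not the paper's actual survey parameters.

```python
import numpy as np

def poisson_upper_limit(confidence=0.95):
    """Upper limit on a Poisson mean given zero detections: -ln(1 - CL), ~3.0 at 95%."""
    return -np.log(1.0 - confidence)

# Assumed, illustrative parameters (only the star-hours come from the abstract).
star_hours = 5.0        # on-ecliptic target-star exposure
v_rel_km_s = 25.0       # assumed KBO shadow velocity across the line of sight
distance_AU = 40.0      # assumed KBO distance
width_km = 2.0          # assumed effective occultation cross-section

AU_KM = 1.496e8
strip_area_km2 = width_km * v_rel_km_s * 3600.0 * star_hours
omega_deg2 = strip_area_km2 / (distance_AU * AU_KM) ** 2 * (180.0 / np.pi) ** 2

limit = poisson_upper_limit() / omega_deg2
print(f"Surface-density upper limit ~ {limit:.1e} deg^-2")
```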

    Is spoken language all-or-nothing? Implications for future speech-based human-machine interaction

    Recent years have seen significant market penetration for voice-based personal assistants such as Apple’s Siri. However, despite this success, user take-up is frustratingly low. This article argues that there is a habitability gap caused by the inevitable mismatch between the capabilities and expectations of human users and the features and benefits provided by contemporary technology. Suggestions are made as to how such problems might be mitigated, but a more worrisome question emerges: “is spoken language all-or-nothing?” The answer, based on contemporary views on the special nature of (spoken) language, is that there may indeed be a fundamental limit to the interaction that can take place between mismatched interlocutors (such as humans and machines). However, it is concluded that interactions between native and non-native speakers, or between adults and children, or even between humans and dogs, might provide critical inspiration for the design of future speech-based human-machine interaction.

    Back to the Future: The Uses of Television in the Digital Age

    This article considers some of the present-day issues, challenges and possibilities facing television broadcasting via a critical examination of the recently published Goldsmiths report on the future of public service television in the twenty-first century. Focusing mainly on UK terrestrial broadcasters (BBC, ITV, Channel 4 and Channel 5), the article summarises and expands on the report's key findings and recommendations, particularly in relation to questions concerning digitalisation, content, diversity, quality, marketisation, funding and national and regional heritage. The article argues that, despite the rise of the Internet and the proliferation of digital platforms, television viewing remains a common source of information and entertainment and is characterised by meaningful continuities. Additionally, the article outlines the vitally important role played by David Puttnam, chair of the Goldsmiths inquiry, in defending public service television through his active engagement with relevant parliamentary committees and as a widely respected media professional. Finally, the article reflects on the continuing relevance of the 1962 Pilkington Report on Broadcasting, which was similarly commissioned in order to evaluate the purposes of television. In so doing, the article suggests that Pilkington's criticisms of creeping commercialism and the ensuing regulatory proposals still represent a cogent engagement with the idea of public service broadcasting as a primary facilitator of deliberative democracy.